uncertainty - meaning and definition. What is uncertainty

What (who) is uncertainty - definition

Entropic uncertainty
IN INFORMATION THEORY, THE SUM OF THE TEMPORAL AND SPECTRAL SHANNON ENTROPIES
Hirschman uncertainty
In quantum mechanics, information theory, and Fourier analysis, the entropic uncertainty or Hirschman uncertainty is defined as the sum of the temporal and spectral Shannon entropies. It turns out that Heisenberg's uncertainty principle can be expressed as a lower bound on the sum of these entropies.
Measurement uncertainty         
PARAMETER CHARACTERIZING THE DISPERSION OF QUANTITY VALUES OF A MEASURAND
Measurement Uncertainty; Measuring uncertainty; Uncertainty of measurement; Type B evaluation of uncertainty; Type A evaluation of uncertainty; Interval of uncertainty; Measurement uncertainties
In metrology, measurement uncertainty is the expression of the statistical dispersion of the values attributed to a measured quantity. All measurements are subject to uncertainty and a measurement result is complete only when it is accompanied by a statement of the associated uncertainty, such as the standard deviation.
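
As a minimal sketch of the Type A evaluation named among the alternate titles above (the readings, the unit, and the coverage factor k = 2 are illustrative assumptions, not part of the definition), the standard uncertainty of a mean of repeated readings is the sample standard deviation divided by the square root of the number of readings:

    import math
    import statistics

    # Hypothetical repeated readings of the same measurand, in millimetres.
    readings = [10.02, 10.05, 9.98, 10.01, 10.03, 9.99, 10.04, 10.00]

    n = len(readings)
    mean = statistics.mean(readings)
    s = statistics.stdev(readings)   # sample standard deviation of the readings
    u_a = s / math.sqrt(n)           # Type A standard uncertainty of the mean
    U = 2 * u_a                      # expanded uncertainty with coverage factor k = 2

    print(f"result: {mean:.3f} mm +/- {U:.3f} mm (k = 2)")
    print(f"standard uncertainty u = {u_a:.4f} mm")

Reporting the mean together with the expanded uncertainty, as in the print statement above, is one common way of making the measurement result "complete" in the sense of the definition.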
Policy uncertainty         
A CLASS OF ECONOMIC RISK WHERE THE FUTURE PATH OF GOVERNMENT POLICY IS UNCERTAIN, RAISING RISK PREMIUM
Policy Uncertainty; Regime uncertainty; Economic Policy Uncertainty Index
Policy uncertainty (also called regime uncertainty) is a class of economic risk where the future path of government policy is uncertain, raising risk premia and leading businesses and individuals to delay spending and investment until this uncertainty has been resolved. Policy uncertainty may refer to uncertainty about monetary or fiscal policy, the tax or regulatory regime, or uncertainty over electoral outcomes that will influence political leadership.

Wikipedia

Entropic uncertainty

In quantum mechanics, information theory, and Fourier analysis, the entropic uncertainty or Hirschman uncertainty is defined as the sum of the temporal and spectral Shannon entropies. It turns out that Heisenberg's uncertainty principle can be expressed as a lower bound on the sum of these entropies. This is stronger than the usual statement of the uncertainty principle in terms of the product of standard deviations.

In 1957, Hirschman considered a function f and its Fourier transform g such that

$$ g(y) \approx \int_{-\infty}^{\infty} \exp(-2\pi i x y)\, f(x)\, dx, \qquad f(x) \approx \int_{-\infty}^{\infty} \exp(2\pi i x y)\, g(y)\, dy\,, $$

where "≈" indicates convergence in $L^{2}$, and the functions are normalized so that (by Plancherel's theorem)

$$ \int_{-\infty}^{\infty} |f(x)|^{2}\, dx = \int_{-\infty}^{\infty} |g(y)|^{2}\, dy = 1\,. $$
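
A concrete transform pair satisfying this normalization (a standard worked example added for illustration, not part of the article) is the Gaussian

$$ f(x) = 2^{1/4} e^{-\pi x^{2}}, \qquad g(y) = 2^{1/4} e^{-\pi y^{2}}, $$

since $e^{-\pi x^{2}}$ is its own Fourier transform in this convention, and

$$ \int_{-\infty}^{\infty} |f(x)|^{2}\, dx = \sqrt{2}\int_{-\infty}^{\infty} e^{-2\pi x^{2}}\, dx = \sqrt{2}\cdot\frac{1}{\sqrt{2}} = 1\,. $$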

He showed that for any such functions the sum of the Shannon entropies is non-negative,

$$ H(|f|^{2}) + H(|g|^{2}) \equiv -\int_{-\infty}^{\infty} |f(x)|^{2}\log |f(x)|^{2}\, dx - \int_{-\infty}^{\infty} |g(y)|^{2}\log |g(y)|^{2}\, dy \geq 0\,. $$
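
To make the inequality concrete, here is a minimal numerical sketch (added for illustration, not from the article). It assumes NumPy, approximates both differential entropies by Riemann sums on a finite grid, and obtains |g|² from an FFT matching the exp(−2πixy) convention above; the function and variable names are illustrative.

    import numpy as np

    def entropy_sum(samples, dx):
        """Riemann-sum estimate of H(|f|^2) + H(|g|^2) for a sampled f,
        with g its Fourier transform in the exp(-2*pi*i*x*y) convention."""
        N = len(samples)
        # Normalize so the discrete analogue of Plancherel holds: sum |f|^2 dx = 1.
        f = samples / np.sqrt(np.sum(np.abs(samples) ** 2) * dx)
        p_x = np.abs(f) ** 2
        # |g(y_k)|^2 on the DFT frequency grid; the phase factor coming from the
        # grid offset has modulus 1 and drops out of the entropy.
        p_y = (dx * np.abs(np.fft.fft(f))) ** 2
        dy = 1.0 / (N * dx)
        def H(p, d):
            # -sum p log p * d, skipping zero bins (0 * log 0 -> 0)
            mask = p > 0
            return -np.sum(p[mask] * np.log(p[mask])) * d
        return H(p_x, dx) + H(p_y, dy)

    N, L = 4096, 40.0
    dx = L / N
    x = (np.arange(N) - N / 2) * dx

    gaussian = 2 ** 0.25 * np.exp(-np.pi * x ** 2)   # the normalized Gaussian above
    bumpy = np.exp(-x ** 4)                          # an arbitrary non-Gaussian profile

    print(entropy_sum(gaussian, dx))   # ~0.3069, the Gaussian (minimal) value; see below
    print(entropy_sum(bumpy, dx))      # strictly larger, and in particular >= 0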

A tighter bound,

$$ H(|f|^{2}) + H(|g|^{2}) \geq \log\frac{e}{2}\,, $$

was conjectured by Hirschman and Everett, proven in 1975 by W. Beckner, and in the same year interpreted as a generalized quantum mechanical uncertainty principle by Białynicki-Birula and Mycielski. The equality holds in the case of Gaussian distributions. Note, however, that the above entropic uncertainty function is distinctly different from the quantum von Neumann entropy represented in phase space.
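
A quick check of the equality case (a standard computation added for illustration, not part of the article text): for the normalized Gaussian pair given earlier, $|f(x)|^{2} = \sqrt{2}\, e^{-2\pi x^{2}}$ is a normal density with variance $\sigma^{2} = 1/(4\pi)$, so

$$ H(|f|^{2}) = \tfrac{1}{2}\log\!\left(2\pi e\,\sigma^{2}\right) = \tfrac{1}{2}\log\frac{e}{2}\,, $$

and likewise for $|g|^{2}$, so the sum equals $\log\frac{e}{2}$ and the bound is saturated. Conversely, since the Gaussian maximizes differential entropy at fixed variance, $H(|f|^{2}) \le \tfrac{1}{2}\log(2\pi e\,\sigma_{x}^{2})$ and $H(|g|^{2}) \le \tfrac{1}{2}\log(2\pi e\,\sigma_{y}^{2})$; combining these with the entropic bound gives $\sigma_{x}\sigma_{y} \ge \frac{1}{4\pi}$, which is the usual product-of-standard-deviations form of the uncertainty principle in these units.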